    Should patients with brain implants undergo MRI?

    Patients suffering from neurodegenerative diseases are increasingly being equipped with neural implants to treat symptoms, restore functions, and improve their quality of life. Magnetic resonance imaging (MRI) would be the modality of choice for the diagnosis and compulsory post-operative monitoring of such patients. However, interactions between the MR environment and implants pose severe health risks to the patient. Nevertheless, neural implant recipients have regularly undergone MRI examinations, and adverse events have rarely been reported; this should not be taken to imply that the procedures are safe. More than 300,000 cochlear implant recipients are excluded from MRI unless the indication outweighs risks that include excruciating pain. For 75,000 deep brain stimulation (DBS) recipients, quite the opposite holds: MRI is considered an essential part of the implantation procedure, and some medical centres deliberately exceed safety regulations that they consider impractically restrictive. MRI-related permanent neurological dysfunction has occurred in DBS recipients in the past when manufacturer recommendations were exceeded. Over the last decades, extensive effort has been invested in identifying, characterising, and quantifying the interactions that occur, yet we remain far from a satisfactory solution that would make MR procedures safe and beneficial for all implant recipients. To contribute, we aim to raise awareness of this growing concern and call on the community to end these inconsistencies and improve the situation for the growing number of affected patients. We therefore review the literature on implant safety in MRI from an engineering point of view, with a focus on cochlear and DBS implants as success stories in clinical practice. We briefly explain the fundamental phenomena that can lead to patient harm and point out both breakthroughs and errors made. We end with conclusions and strategies to prevent future implants from being contraindicated for MR examinations. We believe that implant recipients should be able to undergo MRI, but before they do, we should make sure that the procedure is reasonable.
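    One of the fundamental phenomena referred to above is radio-frequency (RF) induced heating of tissue around conductive implants, commonly quantified via the local specific absorption rate (SAR). A minimal sketch of the standard definitions (not taken from the abstract), with tissue conductivity σ, mass density ρ, peak induced electric field E, and specific heat capacity c:

    ```latex
    % Local specific absorption rate from the peak RF electric field E:
    %   \sigma: tissue conductivity [S/m], \rho: mass density [kg/m^3]
    \mathrm{SAR} = \frac{\sigma\,|E|^{2}}{2\rho}
    % Neglecting perfusion and heat conduction, the initial temperature rise
    % after an exposure of duration t follows from the specific heat c:
    \Delta T \approx \frac{\mathrm{SAR}\cdot t}{c}
    ```

    Elongated implant leads can locally amplify E near their tips, which is why whole-body SAR limits alone do not guarantee safety for implant recipients.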

    Earth Virtualization Engines: A Technical Perspective

    Participants of the Berlin Summit on Earth Virtualization Engines (EVEs) discussed ideas and concepts to improve our ability to cope with climate change. EVEs aim to provide interactive and accessible climate simulations and data for a wide range of users. They combine high-resolution physics-based models with machine learning techniques to improve the fidelity, efficiency, and interpretability of climate projections. At their core, EVEs offer a federated data layer that enables simple and fast access to exabyte-sized climate data through simple interfaces. In this article, we summarize the technical challenges and opportunities for developing EVEs, and argue that they are essential for addressing the consequences of climate change.
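    The "federated data layer" with "simple interfaces" suggests a concrete usage pattern: lazy, chunked access so that only the requested subset of an exabyte-scale store is ever transferred. A minimal sketch of what such access could look like, assuming a Zarr store published over HTTP; the URL, variable, and coordinate names are hypothetical:

    ```python
    # Hypothetical access to an EVE-style federated climate data store.
    # Opening is lazy: no bulk download happens here.
    import xarray as xr

    ds = xr.open_zarr("https://eve.example.org/stores/global-2.8km.zarr")

    # Select a small space-time subset; only the chunks covering it are fetched.
    subset = ds["surface_temperature"].sel(
        time=slice("2030-01-01", "2030-01-31"),
        lat=slice(45.0, 55.0),
        lon=slice(5.0, 15.0),
    )
    monthly_mean = subset.mean(dim="time").compute()
    ```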

    The ESCAPE project: Energy-efficient Scalable Algorithms for Weather Prediction at Exascale

    In the simulation of complex multi-scale flows arising in weather and climate modelling, one of the biggest challenges is to satisfy strict service requirements in terms of time to solution and budgetary constraints in terms of energy to solution, without compromising the accuracy and stability of the application. These simulations require algorithms that minimise the energy footprint along with the time required to produce a solution, maintain the physically required level of accuracy, are numerically stable, and are resilient in case of hardware failure. The European Centre for Medium-Range Weather Forecasts (ECMWF) led the ESCAPE (Energy-efficient Scalable Algorithms for Weather Prediction at Exascale) project, funded by Horizon 2020 (H2020) under the FET-HPC (Future and Emerging Technologies in High Performance Computing) initiative. The goal of ESCAPE was to develop a sustainable strategy for evolving weather and climate prediction models towards next-generation computing technologies. The project partners combine the expertise of leading European regional forecasting consortia, university research, experienced high-performance computing centres, and hardware vendors. This paper presents an overview of the ESCAPE strategy: (i) identify domain-specific key algorithmic motifs in weather prediction and climate models (which we term Weather & Climate Dwarfs), (ii) categorise them in terms of computational and communication patterns, (iii) adapt them to different hardware architectures with alternative programming models, (iv) analyse the challenges in optimising them, and (v) find alternative algorithms for the same scheme. The participating weather prediction models are IFS (Integrated Forecasting System); ALARO, a combination of AROME (Application de la Recherche à l'Opérationnel à Meso-Echelle) and ALADIN (Aire Limitée Adaptation Dynamique Développement International); and COSMO-EULAG, a combination of COSMO (Consortium for Small-scale Modeling) and EULAG (Eulerian and semi-Lagrangian fluid solver). For many of the weather and climate dwarfs, ESCAPE provides prototype implementations on different hardware architectures (mainly Intel Skylake CPUs, NVIDIA GPUs, Intel Xeon Phi, and the Optalysys optical processor) with different programming models. The spectral transform dwarf serves as a detailed example of the co-design cycle of an ESCAPE dwarf; a simplified illustration of this motif follows below. The dwarf concept has proven extremely useful for the rapid prototyping of alternative algorithms and their interaction with hardware, e.g. through the use of a domain-specific language (DSL). Manual adaptations have led to substantial accelerations of key algorithms in numerical weather prediction (NWP) but are not a general recipe for the performance portability of complex NWP models. Existing DSLs are found to require further evolution but are promising tools for achieving the latter. Measurements of energy and time to solution suggest that a future focus needs to be on exploiting the simultaneous use of all available resources in hybrid CPU-GPU arrangements.
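    To make the dwarf concept concrete, the spectral transform motif moves fields between grid-point and spectral space, where horizontal derivatives become cheap multiplications. The sketch below is a simplified 2-D periodic FFT stand-in for the spherical-harmonics transform in IFS, not code from ESCAPE:

    ```python
    # Spectral transform motif on a doubly periodic 2-D grid: transform to
    # spectral space, differentiate by multiplying with i*k, transform back.
    import numpy as np

    n = 256
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    field = np.sin(3.0 * x)[:, None] * np.cos(2.0 * x)[None, :]

    spec = np.fft.fft2(field)                                # grid-point -> spectral
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=2.0 * np.pi / n)   # integer angular wavenumbers
    ddx_spec = 1j * k[:, None] * spec                        # d/dx along the first axis
    ddx = np.real(np.fft.ifft2(ddx_spec))                    # spectral -> grid-point

    # Agrees with the analytic derivative 3*cos(3x)*cos(2y).
    expected = 3.0 * np.cos(3.0 * x)[:, None] * np.cos(2.0 * x)[None, :]
    assert np.allclose(ddx, expected)
    ```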

    Biocompatibility and Bone Formation of Flexible, Cotton Wool-like PLGA/Calcium Phosphate Nanocomposites in Sheep

    BACKGROUND: The purpose of this preliminary study was to assess the in vivo performance of synthetic, cotton wool-like nanocomposites consisting of a biodegradable poly(lactide-co-glycolide) fibrous matrix containing either calcium phosphate nanoparticles (PLGA/CaP 60:40) or silver-doped CaP nanoparticles (PLGA/Ag-CaP 60:40). Besides its extraordinary in vitro bioactivity, the latter biomaterial (0.4 wt% total silver concentration) provides additional antimicrobial properties for treating bone defects exposed to microorganisms. MATERIALS AND METHODS: Both flexible artificial bone substitutes were implanted into a total of 16 epiphyseal and metaphyseal drill-hole defects in the long bones of sheep and followed for 8 weeks. Histological and histomorphometric analyses were conducted to evaluate biocompatibility and bone formation using a scoring system, and the influence of silver on the in vivo performance was further investigated. RESULTS: Semi-quantitative evaluation of the histology sections showed excellent biocompatibility and bone healing for both implant materials, with no resorption in the adjacent bone. No signs of inflammation were detectable, either macroscopically or microscopically, as evidenced in 5 µm plastic sections by the minimal number of inflammatory cells. The fibrous biomaterials enabled bone formation directly in the centre of the former defect. The area fraction of new bone formation determined histomorphometrically after 8 weeks of implantation was very similar for the two materials: 20.5 ± 11.2 % for PLGA/CaP and 22.5 ± 9.2 % for PLGA/Ag-CaP. CONCLUSIONS: The cotton wool-like bone substitute material is easily applicable, biocompatible, and might be beneficial in minimally invasive surgery for treating bone defects.
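    For context on the histomorphometric readout (not code from the study): an area fraction such as "20.5 ± 11.2 %" is the area of segmented new bone divided by the area of the former defect region in a stained section. A minimal sketch with hypothetical placeholder masks:

    ```python
    # Area fraction of new bone inside the former defect region, from two
    # binary masks. Both masks here are random placeholders, not real data.
    import numpy as np

    rng = np.random.default_rng(0)
    defect_roi = np.ones((512, 512), dtype=bool)    # former defect region
    new_bone = rng.random((512, 512)) < 0.2         # segmented new bone

    area_fraction = 100.0 * np.count_nonzero(new_bone & defect_roi) / np.count_nonzero(defect_roi)
    print(f"new bone area fraction: {area_fraction:.1f} %")
    ```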

    Scientific, sustainability and regulatory challenges of cultured meat

    Producing meat without the drawbacks of conventional animal agriculture would greatly contribute to future food and nutrition security. This Review Article covers the biological, technological, regulatory, and consumer-acceptance challenges in this developing field of biotechnology. Cellular agriculture is an emerging branch of biotechnology that aims to address the environmental impact, animal welfare, and sustainability challenges of conventional animal farming for meat production. Cultured meat can be produced by applying current cell culture practices and biomanufacturing methods, utilizing mammalian cell lines and cell- and gene-therapy products to generate tissue or nutritional proteins for human consumption. However, significant improvements and modifications are needed for the process to be cost-efficient and robust enough to be brought to production at scale for the food supply. Here, we review the scientific and social challenges in transforming cultured meat into a viable commercial option, covering aspects from cell selection and medium optimization to biomaterials, tissue engineering, regulation, and consumer acceptance.
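    A back-of-the-envelope calculation illustrates the scale-up challenge mentioned above. All numbers below are hypothetical placeholders, not figures from the review:

    ```python
    # How many population doublings are needed to expand a seed culture to a
    # target cell mass, assuming exponential growth at a constant doubling time.
    import math

    seed_mass_g = 1e-3        # ~1 mg of starter cells (assumed)
    target_mass_g = 1e5       # 100 kg of cell mass (assumed)
    doubling_time_h = 24.0    # assumed doubling time

    doublings = math.log2(target_mass_g / seed_mass_g)   # ~26.6 doublings
    days = doublings * doubling_time_h / 24.0
    print(f"{doublings:.1f} doublings, about {days:.0f} days of expansion")
    ```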